Google Announces Solution to a Big Problem in Quantum Computing

December 16, 2024

Google recently said that it had solved a major problem in quantum computing with a new generation of processor called Willow. Along with other technology companies such as Microsoft and International Business Machines (IBM), Google, whose parent company is Alphabet, is working on quantum computing.

The new development promises to increase computing speeds while limiting mistakes, or errors, which are a problem with quantum computing systems.

Scientists at the company's Santa Barbara, California, quantum laboratory announced on December 9 that they had at least partly solved the error problem. On the same day, the scientific publication Nature published their paper on error correction.

Google said that the Willow processor carried out a computation in under five minutes that would have taken a fast supercomputer longer than the current age of the universe to complete.

The work of the Google researchers does not have any commercial uses yet. But Google hopes quantum computers will one day solve problems in medicine, battery chemistry and artificial intelligence (AI) that today's computers cannot solve.

The Willow chip

The Willow processor, or chip, runs on units of data called "quantum bits," or "qubits" for short. Qubits permit fast computing but can easily create errors. Google researchers have a theory that qubits might be affected by subatomic particles from events in space.

As more qubits are placed onto a chip, the errors can increase until the chip is no better than today's usual computer chips. As a result, scientists have been working on quantum error correction since the 1990s.

Hartmut Neven, who leads Google's Quantum AI group, said that Willow has 105 qubits. In the paper recently published in Nature, Google researchers said they have found a way to connect the Willow chip's qubits so that error rates go down as the number of qubits goes up.

Neven also said Willow can correct errors in "real time." That is an important step toward making quantum computers useful. "We are past the break even point," Neven told Reuters. In a blog post, he said it was an unmistakable sign that "error correction is improving overall."

In 2019, IBM challenged the claim that Google's quantum chip solved a problem that would take a normal computer 10,000 years to complete. IBM said the problem could be solved in two-and-a-half days using different assumptions about computer system design. Google said in its blog post that it considered some of those concerns in its newest estimates. Even under the best conditions, Google said a computer of today would still take a billion years to get the same results as its latest chip.

Some of Google's competitors are producing chips with more qubits than Willow. But Anthony Megrant, who is also with Google Quantum AI, told Reuters that Google is trying to make the most dependable qubits possible.

Google produced its earlier chips in a shared building at the University of California, Santa Barbara. But for the Willow chips, Google built its own special laboratory. Megrant said that the new lab will speed up the process of making future chips.

One major problem is that quantum chips must be kept very cold, inside machines called cryostats that produce extremely low temperatures. The new laboratory permits the researchers to work more quickly.

"If we have a good idea, we want somebody on the team to be able to...get into that clean room and into one of these cryostats as fast as possible, so we can get lots of cycles of learning," Megrant said.
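The claim that error rates go down as the number of qubits goes up can be illustrated with a short calculation. The Python sketch below uses a simplified textbook model of quantum error correction; the error rates, threshold, and qubit counts are assumed example numbers, not figures from Google's paper or the Nature study.

# A minimal sketch of the idea described above, with made-up numbers:
# when the physical error rate is below the correction "threshold,"
# adding more qubits (a larger code distance) lowers the logical error rate.

def qubits_needed(distance: int) -> int:
    # A distance-d surface code uses roughly 2*d*d - 1 physical qubits.
    return 2 * distance * distance - 1

def logical_error_rate(p_physical: float, p_threshold: float, distance: int) -> float:
    # Rough textbook scaling: logical errors shrink exponentially with
    # distance as long as p_physical stays below p_threshold.
    return (p_physical / p_threshold) ** ((distance + 1) // 2)

if __name__ == "__main__":
    p_physical = 0.003    # assumed per-operation error rate (example only)
    p_threshold = 0.01    # assumed error-correction threshold (example only)
    for d in (3, 5, 7):
        print(f"distance {d}: about {qubits_needed(d)} qubits, "
              f"logical error rate about {logical_error_rate(p_physical, p_threshold, d):.1e}")

In this simplified model, each step up in code distance uses more physical qubits but cuts the logical error rate by the same factor. That is the kind of behavior the article describes as being "past the break even point": adding qubits helps instead of hurts.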
"If we have a good idea, we want somebody on the team to be able to...get into that clean room and into one of these cryostats as fast as possible, so we can get lots of cycles of learning," Megrant said. I'm Jill Robbins. Stephen Nellis reported this story for Reuters news agency. Jill Robbins adapted it for Learning English with additional information from Google and Nature. ______________________________________________ Words in This Story quantum - adj. of, relating to, or using the principles of quantum theory computation -n. an operation carried out by a computing device that is mathematical and digital in nature commercial -adj. related to business activity assumption -n. one of the basic conditions that is accepted in making a computation, but which might not exist in reality cycle - n. a repeating series of events or actions What do you think of this story? Write to us in the Comments Section.